
    Narrowband Interference Suppression in Wireless OFDM Systems

    Signal distortions in communication systems occur between the transmitter and the receiver; these distortions normally cause bit errors at the receiver. In addition, interference from other signals may further degrade the performance of the communication link. To achieve reliable communication, the effects of channel distortion and interfering signals must be reduced using appropriate techniques. This paper introduces the fundamentals of Orthogonal Frequency Division Multiplexing (OFDM) and Orthogonal Frequency Division Multiple Access (OFDMA), reviews the effects of interference on a digital data communication link, and explores methods for mitigating or compensating for these effects.
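The OFDM fundamentals the abstract refers to can be illustrated with a minimal modulator/demodulator sketch: an IFFT places one data symbol on each orthogonal subcarrier, a cyclic prefix guards against channel delay spread, and an FFT at the receiver recovers the symbols. The parameter names and sizes below are illustrative assumptions, not values from the paper.

```python
import numpy as np

N_SC = 64        # number of subcarriers (assumed for illustration)
CP_LEN = 16      # cyclic-prefix length (assumed)

def ofdm_modulate(symbols):
    """Map one block of N_SC data symbols onto orthogonal subcarriers."""
    time_signal = np.fft.ifft(symbols, n=N_SC)   # IFFT: one symbol per subcarrier
    # Prepend the cyclic prefix (copy of the tail of the symbol)
    return np.concatenate([time_signal[-CP_LEN:], time_signal])

def ofdm_demodulate(rx):
    """Strip the cyclic prefix and recover the subcarrier symbols via FFT."""
    return np.fft.fft(rx[CP_LEN:], n=N_SC)

# QPSK symbols on each subcarrier
rng = np.random.default_rng(0)
tx_syms = (2 * rng.integers(0, 2, N_SC) - 1) + 1j * (2 * rng.integers(0, 2, N_SC) - 1)
rx_syms = ofdm_demodulate(ofdm_modulate(tx_syms))
print(np.allclose(tx_syms, rx_syms))  # True: subcarriers remain orthogonal
```

Over an ideal channel the round trip is exact; in practice, channel distortion and narrowband interference corrupt individual subcarriers, which is what the suppression techniques surveyed in the paper address.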

    Interference Mapping in 3D for High-Density Indoor IoT Deployments

    Deployment of practical Internet of Things (IoT) systems in the context of 5G can be hindered by substantial interference and spectrum limitations, especially in the unlicensed frequency bands. The high density of such devices in indoor scenarios further emphasizes the need for interference characterization that enables more effective spectrum utilization. This chapter studies the influence of diverse dense-interferer placement scenarios on spectrum occupancy through 3D interference maps for two popular IoT technologies, LoRa and Wi-Fi. The experiments are performed in real time with software-defined radio (SDR) platforms and an automated positioning tool that provides the measurements needed to characterize interference in 3D space. The findings demonstrate the nonuniform character of the interference and the significant impact of fading across the width, height, and length of the examined area, and suggest that dynamic relocation matters in realistic IoT scenarios.
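A 3D interference map of the kind described above can be thought of as a voxel grid over the measurement volume, where each voxel holds the average received power of the samples taken inside it. The following sketch shows one plausible way to build such a map from positioned RSSI samples; the function name, grid parameters, and dBm-to-mW averaging choice are assumptions for illustration, not the chapter's actual processing chain.

```python
import numpy as np

def build_interference_map(positions, rssi_dbm, grid_shape, area_dims):
    """Average received power per voxel of a 3D grid over the measured volume.

    positions : (M, 3) measurement coordinates in meters
    rssi_dbm  : M received-power samples in dBm
    grid_shape: voxel counts along (x, y, z)
    area_dims : physical extent of the area along (x, y, z), in meters
    """
    power_mw = 10 ** (np.asarray(rssi_dbm, dtype=float) / 10)  # dBm -> mW (average in linear domain)
    idx = np.floor(np.asarray(positions) / np.asarray(area_dims) * grid_shape).astype(int)
    idx = np.clip(idx, 0, np.asarray(grid_shape) - 1)          # keep boundary samples in range

    acc = np.zeros(grid_shape)   # summed power per voxel
    cnt = np.zeros(grid_shape)   # sample count per voxel
    for (i, j, k), p in zip(idx, power_mw):
        acc[i, j, k] += p
        cnt[i, j, k] += 1

    mean_mw = np.where(cnt > 0, acc / np.maximum(cnt, 1), np.nan)  # NaN = unmeasured voxel
    return 10 * np.log10(mean_mw)                                  # back to dBm

# One sample at the center of a 1 m cube, mapped onto a 2x2x2 grid
imap = build_interference_map([[0.5, 0.5, 0.5]], [-60.0], (2, 2, 2), (1.0, 1.0, 1.0))
```

Averaging in the linear (mW) domain rather than directly in dBm is a common choice because power, not its logarithm, is additive; voxels with no samples stay NaN so that fading-induced gaps remain visible in the map.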

    Machine Learning Enabled Performance Prediction Model for Massive-MIMO HetNet System

    To support upcoming novel applications, fifth generation (5G) and beyond 5G (B5G) wireless networks are being propelled to deploy ultra-dense networks with ultra-high spectral efficiency using a combination of heterogeneous network (HetNet) solutions and massive Multiple Input Multiple Output (MIMO). As the deployment of massive MIMO HetNet systems involves high capital expenditure, network service providers need a precise performance analysis before investing. The performance of such networks is limited by the presence of inter-cell and inter-tier interference. The conventional analytic approach to modeling such networks is not trivial, as their performance is a stochastic function of many network parameters. This paper proposes a machine learning (ML) approach to predict the network performance of a massive MIMO HetNet system in a multi-cell scenario. It considers a two-tier network in which the base stations of each tier are equipped with massive MIMO systems operating in a sub-6 GHz band. The coverage probability (CP) and area spectral efficiency (ASE) are taken as the network performance metrics, quantifying the reliability and the achievable rate in the network, respectively. An ML model is trained to predict the numerical values of these performance metrics for an arbitrary network configuration. Such a model could be very valuable in the practical deployment of future networks.
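The core idea of the paper, replacing an intractable stochastic analysis with a learned mapping from network parameters to performance metrics, can be sketched as a regression problem. The synthetic data-generating function, feature choices, and linear model below are stand-ins invented for illustration; the paper's actual features, metrics, and ML model differ.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented network-configuration features: (BS density per km^2,
# antennas per BS, tier-2/tier-1 power ratio)
X = rng.uniform([1, 16, 0.1], [50, 256, 1.0], size=(500, 3))

# Stand-in "ground truth" coverage probability: improves with antenna
# count, degrades with BS density (interference). Purely illustrative.
cp_true = 0.5 + 0.3 * np.tanh(X[:, 1] / 128) - 0.004 * X[:, 0]
y = np.clip(cp_true + rng.normal(0, 0.01, 500), 0.0, 1.0)

# Minimal regressor: least-squares linear model with a bias term,
# standing in for the paper's ML predictor.
A = np.column_stack([X, np.ones(len(X))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)

pred = A @ w
rmse = np.sqrt(np.mean((pred - y) ** 2))  # small residual on this synthetic data
```

Once trained on simulated (configuration, CP/ASE) pairs, such a predictor returns performance estimates for an arbitrary configuration in microseconds, which is what makes the approach attractive for pre-deployment what-if analysis.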
